Fused Orthogonal Alternating Least Squares for Tensor Clustering
We introduce a multi-mode tensor clustering method that implements a fused version of the alternating least squares algorithm (Fused-Orth-ALS) for simultaneous tensor factorization and clustering. Statistical convergence rates for recovery and clustering are established when the data are a noise-contaminated tensor with a latent low-rank CP decomposition structure. Furthermore, we show that a modified alternating least squares algorithm can provably recover the true latent low-rank factorization structure when the data form a perturbed asymmetric tensor. Clustering consistency is also established. Finally, we illustrate the accuracy and computational efficiency of the Fused-Orth-ALS algorithm using both simulations and real datasets.
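To make the factorization step concrete, here is a minimal NumPy sketch of orthogonalized alternating least squares for a third-order rank-R CP model, the backbone that Fused-Orth-ALS builds on. This is an illustrative reconstruction, not the authors' algorithm: the fusion (clustering) penalty is omitted, and the function names, QR schedule, and iteration count are assumptions.

```python
# Minimal sketch: orthogonalized ALS for a 3-way CP decomposition.
# The fusion penalty of Fused-Orth-ALS is deliberately omitted.
import numpy as np

def unfold(T, mode):
    """Matricize T along `mode` (remaining modes flattened row-major)."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def khatri_rao(U, V):
    """Column-wise Kronecker product of two factor matrices."""
    R = U.shape[1]
    return np.einsum('ir,jr->ijr', U, V).reshape(-1, R)

def orth_als(T, R, n_iter=50, seed=0):
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((T.shape[0], R))
    B = rng.standard_normal((T.shape[1], R))
    C = rng.standard_normal((T.shape[2], R))
    for _ in range(n_iter):
        # "Orth" step: re-orthogonalize factor estimates via thin QR
        # so they do not all collapse onto one dominant component.
        B, _ = np.linalg.qr(B)
        C, _ = np.linalg.qr(C)
        # Standard ALS least-squares updates, one mode at a time.
        A = unfold(T, 0) @ np.linalg.pinv(khatri_rao(B, C).T)
        B = unfold(T, 1) @ np.linalg.pinv(khatri_rao(A, C).T)
        C = unfold(T, 2) @ np.linalg.pinv(khatri_rao(A, B).T)
    return A, B, C

# Usage: recover a planted rank-3 structure from a noiseless tensor.
rng = np.random.default_rng(1)
A0, B0, C0 = (rng.standard_normal((s, 3)) for s in (10, 12, 14))
T = np.einsum('ir,jr,kr->ijk', A0, B0, C0)
A, B, C = orth_als(T, R=3)
T_hat = np.einsum('ir,jr,kr->ijk', A, B, C)
print("relative error:", np.linalg.norm(T - T_hat) / np.linalg.norm(T))
```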
New Methods with Radial Basis Functions, Part 4 (Machine Learning)
Scattered data fitting is a frequently encountered problem of reconstructing an unknown function from given scattered data. Radial basis function (RBF) methods have proven highly useful for this problem. We describe two quantum algorithms to efficiently fit scattered data, based on globally and compactly supported RBFs respectively. For the globally supported RBF method, the core of the quantum algorithm relies on using coherent states to calculate the radial functions and a nonsparse matrix exponentiation technique for efficiently performing a matrix inversion. A quadratic speedup in the number of data points is achieved over the classical algorithms.
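The quantum routines themselves do not reduce to a short snippet, but the classical problem they accelerate does: form the RBF interpolation matrix and solve the linear system. Below is a hypothetical classical baseline in NumPy showing both a globally supported (Gaussian) and a compactly supported (Wendland) radial function; the shape parameters and function names are illustrative assumptions, and this O(N^3) solve is exactly what the quantum algorithm speeds up.

```python
# Classical baseline for RBF scattered-data fitting: the quantum
# algorithms in the paper accelerate exactly this linear solve.
import numpy as np

def gaussian_rbf(r, eps=3.0):
    """Globally supported radial function."""
    return np.exp(-(eps * r) ** 2)

def wendland_c2(r, delta=0.5):
    """Compactly supported Wendland C^2 function (zero for r >= delta)."""
    s = np.clip(r / delta, 0.0, 1.0)
    return (1.0 - s) ** 4 * (4.0 * s + 1.0)

def pairwise_dist(A, B):
    return np.linalg.norm(A[:, None, :] - B[None, :, :], axis=-1)

def rbf_fit(X, y, phi):
    """Solve the N x N interpolation system Phi c = y."""
    return np.linalg.solve(phi(pairwise_dist(X, X)), y)

def rbf_eval(Xnew, X, coeffs, phi):
    return phi(pairwise_dist(Xnew, X)) @ coeffs

# Usage: interpolate sin(2*pi*x) from 30 scattered 1-D points.
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(30, 1))
y = np.sin(2 * np.pi * X[:, 0])
for phi in (gaussian_rbf, wendland_c2):
    c = rbf_fit(X, y, phi)
    print(phi.__name__, rbf_eval(np.array([[0.25]]), X, c, phi))
```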
Variational approximations using Fisher divergence
Yang, Yue, Martin, Ryan, Bondell, Howard
Modern applications of Bayesian inference involve models that are sufficiently complex that the corresponding posterior distributions are intractable and must be approximated. The most common approximation is based on Markov chain Monte Carlo, but this can be expensive when the data set is large and/or the model is complex, so more efficient variational approximations have recently received considerable attention. Traditional variational methods, which seek to minimize the Kullback-Leibler divergence between the posterior and a relatively simple parametric family, provide accurate and efficient estimation of the posterior mean, but often fail to capture other moments and are limited in the models to which they can be applied. Here we propose the construction of variational approximations based on minimizing the Fisher divergence, and we develop an efficient computational algorithm that can be applied to a wide range of models without conjugacy or potentially unrealistic mean-field assumptions. We demonstrate the superior performance of the proposed method on the benchmark case of logistic regression.
- Asia > Middle East > Jordan (0.04)
- North America > United States > North Carolina (0.04)
- North America > United States > New York (0.04)
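To make the Fisher-divergence construction above concrete, here is a hypothetical NumPy/SciPy sketch for the logistic regression benchmark: a mean-field Gaussian q = N(m, diag(s^2)) is fit by minimizing a Monte Carlo estimate of E_q ||grad log q - grad log p||^2, which requires only the score of the unnormalized posterior. The simulated data, sample sizes, and BFGS optimizer are illustrative assumptions, not the authors' algorithm.

```python
# Sketch: mean-field Gaussian variational fit for logistic regression
# by minimizing a Monte Carlo estimate of the Fisher divergence. The
# posterior score needs no normalizing constant, which is the key
# computational appeal of this divergence.
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

rng = np.random.default_rng(1)
n, d = 200, 3
X = rng.standard_normal((n, d))
theta_true = np.array([1.0, -2.0, 0.5])
y = rng.binomial(1, expit(X @ theta_true))

eps = rng.standard_normal((64, d))  # fixed common random numbers

def fisher_divergence(params):
    m, log_s = params[:d], params[d:]
    s = np.exp(log_s)
    thetas = m + s * eps                      # reparameterized q-samples
    score_q = -(thetas - m) / s**2            # closed-form score of q
    # Score of the unnormalized posterior: N(0, I) prior + likelihood.
    score_p = -thetas + (y - expit(thetas @ X.T)) @ X
    return np.mean(np.sum((score_q - score_p) ** 2, axis=1))

res = minimize(fisher_divergence, np.zeros(2 * d), method="BFGS")
print("variational mean:", res.x[:d])
print("variational sds :", np.exp(res.x[d:]))
```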
Bayesian Extensions of Kernel Least Mean Squares
Park, Il Memming, Seth, Sohan, Van Vaerenbergh, Steven
The kernel least mean squares (KLMS) algorithm is a computationally efficient nonlinear adaptive filtering method that "kernelizes" the celebrated (linear) least mean squares algorithm. We demonstrate that the least mean squares algorithm is closely related to Kalman filtering, and thus the KLMS can be interpreted as an approximate Bayesian filtering method. This allows us to systematically develop extensions of the KLMS by modifying the underlying state-space and observation models. The resulting extensions introduce many desirable properties, such as "forgetting" and the ability to learn from discrete data, while retaining the computational simplicity and time complexity of the original algorithm.
- North America > United States > Texas > Travis County > Austin (0.04)
- North America > United States > New Jersey (0.04)
- Europe > Spain (0.04)
- Europe > Finland > Uusimaa > Helsinki (0.04)
- Information Technology > Artificial Intelligence > Representation & Reasoning > Uncertainty > Bayesian Inference (0.93)
- Information Technology > Artificial Intelligence > Machine Learning > Statistical Learning (0.69)
- Information Technology > Artificial Intelligence > Machine Learning > Learning Graphical Models > Directed Networks > Bayesian Learning (0.68)
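For orientation, here is a minimal sketch of the baseline KLMS filter that the paper extends: the prediction error at each step becomes the coefficient of a new Gaussian kernel unit centered at the current input. The step size, kernel width, and absence of any sparsification budget are illustrative choices, not the paper's Bayesian extensions.

```python
# Minimal sketch of kernel least mean squares (KLMS) with a Gaussian
# kernel: the filter grows by one kernel unit per sample, weighted by
# the learning rate times the instantaneous prediction error.
import numpy as np

def klms(xs, ys, eta=0.2, width=1.0):
    """Online KLMS; returns the one-step-ahead predictions."""
    kernel = lambda a, b: np.exp(-np.sum((a - b) ** 2) / (2 * width**2))
    centers, coeffs, preds = [], [], []
    for x, y in zip(xs, ys):
        # Predict with the current expansion f(x) = sum_i c_i k(x_i, x).
        f = sum(c * kernel(xc, x) for xc, c in zip(centers, coeffs))
        preds.append(f)
        # LMS-style update in feature space: add one kernel unit.
        centers.append(x)
        coeffs.append(eta * (y - f))
    return np.array(preds)

# Usage: track a noisy nonlinear mapping from a scalar input stream.
rng = np.random.default_rng(0)
xs = rng.uniform(-2, 2, size=(300, 1))
ys = np.sin(xs[:, 0]) + 0.1 * rng.standard_normal(300)
preds = klms(xs, ys)
print("MSE, last 100 steps:", np.mean((ys[-100:] - preds[-100:]) ** 2))
```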
Learning, Regularization and Ill-Posed Inverse Problems
Rosasco, Lorenzo, Caponnetto, Andrea, Vito, Ernesto D., Odone, Francesca, Giovannini, Umberto D.
Many works have shown that strong connections relate learning from examples to regularization techniques for ill-posed inverse problems. Nevertheless, until now there has been no formal evidence either that learning from examples can be seen as an inverse problem or that theoretical results in learning theory can be independently derived using tools from regularization theory. In this paper we provide a positive answer to both questions. Indeed, considering the square loss, we translate the learning problem into the language of regularization theory and show that consistency results and the optimal choice of regularization parameter can be derived by discretizing the corresponding inverse problem.
- North America > United States > New York (0.05)
- North America > United States > Massachusetts (0.04)
- North America > United States > District of Columbia > Washington (0.04)
- (3 more...)
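To see the correspondence in code, consider regularized least-squares learning with a kernel: the coefficients solve (K + n*lam*I) c = y, which is exactly Tikhonov regularization of the ill-posed system K c = y. A minimal sketch follows, assuming a Gaussian kernel; the kernel choice and parameter values are illustrative.

```python
# Square-loss learning as Tikhonov regularization: solve
# (K + n*lam*I) c = y instead of the ill-posed K c = y.
import numpy as np

def tikhonov_regression(X, y, lam, width=0.5):
    n = len(y)
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-sq / (2 * width**2))       # Gaussian kernel (Gram) matrix
    return np.linalg.solve(K + n * lam * np.eye(n), y)

# lam > 0 restores stability: K alone can be near-singular, and how
# lam is chosen as a function of n is what drives the consistency
# results discussed in the paper.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(50, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(50)
c = tikhonov_regression(X, y, lam=1e-3)
print("coefficient norm:", np.linalg.norm(c))
```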
Lazy Learning Meets the Recursive Least Squares Algorithm
Birattari, Mauro, Bontempi, Gianluca, Bersini, Hugues
Lazy learning is a memory-based technique that, once a query is received, extracts a prediction by locally interpolating the neighboring examples of the query that are considered relevant according to a distance measure. In this paper we propose a data-driven method to select, on a query-by-query basis, the optimal number of neighbors to be considered for each prediction. As an efficient way to identify and validate local models, the recursive least squares algorithm is introduced in the context of local approximation and lazy learning. Furthermore, besides the winner-takes-all strategy for model selection, a local combination of the most promising models is explored. The proposed method is tested on six different datasets and compared with a state-of-the-art approach.
- Europe > Belgium (0.05)
- North America > United States > New York > New York County > New York City (0.04)
- North America > United States > Massachusetts > Suffolk County > Boston (0.04)
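To illustrate why recursive least squares fits the lazy learning setting, here is a minimal sketch: as the neighbors of a query are added one at a time (nearest first), each local linear model is obtained from the previous one by a rank-one update instead of a refit, so models with k, k+1, ... neighbors can all be compared cheaply. The ridge seeding and function names are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of the recursive least squares (RLS) update used to
# grow a local linear model one neighbor at a time.
import numpy as np

def rls_grow(X, y, ridge=1e-3):
    """Add rows of X one at a time; yield coefficients after each."""
    d = X.shape[1]
    P = np.eye(d) / ridge          # running inverse of X^T X (ridge-seeded)
    beta = np.zeros(d)
    for x, target in zip(X, y):
        # Sherman-Morrison rank-one update of the inverse Gram matrix.
        Px = P @ x
        gain = Px / (1.0 + x @ Px)
        beta = beta + gain * (target - x @ beta)
        P = P - np.outer(gain, Px)
        yield beta.copy()

# Usage: for a query, sort the training points by distance, scan the
# sequence of local models produced as neighbors are added, and keep
# the best-validated one (winner-takes-all) or combine the most
# promising ones, as the paper proposes.
rng = np.random.default_rng(0)
X = rng.standard_normal((20, 2))
y = X @ np.array([1.5, -0.5]) + 0.05 * rng.standard_normal(20)
for k, beta in enumerate(rls_grow(X, y), start=1):
    if k in (5, 10, 20):
        print(f"model with {k} neighbors:", beta)
```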